This repository was archived by the owner on Nov 27, 2024. It is now read-only.

Support for low memory devices #111

Merged: 4 commits into master, Feb 5, 2024

Conversation

@saddam213 (Member) commented Feb 5, 2024

Adds memory profiles to give users with low-end cards access to Stable Diffusion.

Two modes:
- Maximum = keep all models cached in memory
- Minimum = unload each model after inference (slower)
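The two modes trade memory for speed: Maximum pays the model-load cost once and keeps weights resident, while Minimum reloads on every run. A minimal sketch of that pattern in Python (the actual PR is C#; `MemoryMode`, `ModelSession`, and `run_inference` are hypothetical names, not the repo's API):

```python
from enum import Enum


class MemoryMode(Enum):
    MAXIMUM = "maximum"  # keep all models cached in memory
    MINIMUM = "minimum"  # unload each model after inference (slower)


class ModelSession:
    """Hypothetical wrapper around an inference session."""

    def __init__(self, name: str):
        self.name = name
        self._session = None

    def load(self):
        # Stand-in for loading model weights onto the device;
        # a no-op if the session is already resident.
        if self._session is None:
            self._session = object()
        return self._session

    def unload(self):
        # Release the session so device memory can be reclaimed.
        self._session = None

    @property
    def is_loaded(self) -> bool:
        return self._session is not None


def run_inference(model: ModelSession, mode: MemoryMode) -> str:
    model.load()
    result = f"output from {model.name}"  # stand-in for the diffusion step
    if mode is MemoryMode.MINIMUM:
        # Free memory immediately, at the cost of reloading next run.
        model.unload()
    return result
```

This explains the benchmark shape below: under Maximum the first run pays the load cost and later runs are fast, while under Minimum every run pays it again.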

Memory Benchmarks:

| Model                 | Maximum | Minimum |
|-----------------------|---------|---------|
| StableDiffusion       | 7.2 GB  | 5.4 GB  |
| StableDiffusion Olive | 4 GB    | 2.1 GB  |
| SDXL                  | 27 GB   | 18 GB   |
| SDXL Olive            | 14.2 GB | 7.2 GB  |

Performance Benchmarks:

| StableDiffusion (30 steps) | 1st Run | 2nd Run | 3rd Run |
|----------------------------|---------|---------|---------|
| Maximum                    | 16s     | 5.9s    | 5.7s    |
| Minimum                    | 16s     | 16s     | 16s     |

| StableDiffusion Olive (30 steps) | 1st Run | 2nd Run | 3rd Run |
|----------------------------------|---------|---------|---------|
| Maximum                          | 8.9s    | 2.2s    | 1.1s    |
| Minimum                          | 9.0s    | 8.8s    | 8.8s    |

| SDXL Olive (30 steps) | 1st Run | 2nd Run | 3rd Run |
|-----------------------|---------|---------|---------|
| Maximum               | 28s     | 8.6s    | 8.0s    |
| Minimum               | 28s     | 28s     | 28s     |

@saddam213 saddam213 merged commit d83a59c into master Feb 5, 2024
@saddam213 saddam213 deleted the LowVram branch February 5, 2024 22:47